Terms and conditions of our future
The Facebook Papers, a series of documents leaked by whistleblower Frances Haugen, brim with revelations. The company appears to have been fully aware of its role in the dissemination of false information and anger-inducing content. Facebook knew about its subsidiary Instagram’s negative impact on the mental health of teenage users, and about its own role in stoking violence in developing countries. Moral philosopher Katleen Gabriels and data protection lawyer Paolo Balboni discuss the problems and possible solutions.
Data protection, privacy and cybersecurity are crucial to upholding the fundamental rights of individuals and preventing everything from fraud and blackmail to disasters like the Colonial Pipeline attack, which caused chaos in the US last year. And that’s not all, Balboni says. “A small number of digital service providers have tremendous power and influence over our lives and those of our children, who are growing up with social media. It’s vital that we think beyond convenience and look at the ethics of this system.”
Gabriels agrees that digital service providers (DSPs) have a profound influence. “In one of their emotion studies, Facebook tested 600,000 users—without their consent—to see how positive or negative biases in our newsfeed affect us. It turns out that we’re emotionally highly susceptible to these biases.” As she points out, it is hardly surprising that many tech CEOs and developers strictly limit their children’s use of digital technology.
Lawful—but (potentially) harmful
“DSPs need to adopt a clearly defined, sustainable role in global society, or we’ll have an immense problem on our hands,” says Balboni. At present, companies are beholden only to the law and their shareholders. “We need a fundamental shift in the approach to regulating DSPs, one that goes beyond legislative regulation.” As part of his research project Data Protection as a Corporate Social Responsibility, Balboni aims to create a Maastricht Digital Pact. This framework will call on organisations to improve their implementation of data protection and security and to fund awareness campaigns to educate citizens about privacy, data protection and cybersecurity.
Why not just have better laws? “Do we really believe in global, effective enforcement based on current regulations?” asks Balboni. He points out how companies approach the substantial fines for violations of the EU’s General Data Protection Regulation (GDPR). “Organisations can set 4% of their annual turnover aside as part of their business model, balancing potential fines against profits.”
No stick among the carrots
While Gabriels supports the notion of a Maastricht Digital Pact, she worries about ethical whitewashing. “Prior to the ‘Dieselgate’ scandal, Volkswagen’s corporate social responsibility communication strategy centred on its ambition to become the world’s most sustainable company. Those vague pledges are often window dressing, so I think an ethical code is too light, especially when there are no painful consequences.” In her view, the pact has to be part of a concerted effort. “You could significantly raise the 4% cap for fines under the GDPR and enforce it more rigorously. You need more lawmakers and politicians well-versed in these topics, and much, much better education.”
Katleen Gabriels is a moral philosopher, specialised in computer and machine ethics. She is an assistant professor at the Faculty of Arts and Social Sciences and programme director of the interdisciplinary BA in Digital Society. Her research focuses on the co-shaping of morality and computer technologies. Gabriels is the author of Conscientious AI: Machines learning morals and How digitalisation shapes your life.
Balboni’s proposed framework does not rely on motivating tech giants purely through the chance to do good. The premise is that increased transparency and oversight would lead to a competitive disadvantage for organisations failing to adopt these best practices. “This approach relies on virtuous competition between companies on issues such as social responsibility and respecting users’ data.”
Quasi-monopoly
“It’s true that people care about their reputations,” counters Gabriels, “but at the same time, if you effectively have a monopoly, there’s no real incentive to play by the rules. If people think they’ll get away with it, they’ll try to. It’s amazing how many scandals Facebook has had in the last decade, without suffering serious consequences.”
Facebook has, in effect, morally disengaged, stressing that it is a social platform and not a news agency, and thus not responsible for its content. Yet supposedly neutral algorithms designed to maximise engagement will always promote the most incendiary messages. “Paolo is right to point out its tremendous societal responsibility, since many adults and teenagers now get their news from Facebook,” Gabriels continues. Conveniently renamed Meta Platforms Inc. since the latest round of revelations, the company now also owns WhatsApp and Instagram, among other platforms. “Because they answer our social needs, users are effectively locked in, so there’s very little leverage to make them change.”
Public awareness and moral actors
Gabriels further points out that the tech giants buy up every potential competitor at a very early stage. While the myth of two university dropouts in a garage challenging Alphabet’s or Meta’s monopoly might be naïve, Balboni thinks that a good enough concept with the backing of a major investment fund could indeed drive competition around selling points like ethics, privacy and a socially responsible approach to digital business.
Gabriels also warns against becoming cynical or defeatist. “Some of these companies have tried to take on more responsibility after being pushed from both the outside and the inside. Most of these shifts were triggered by whistleblowers: insiders putting morals ahead of their careers—although admittedly those are the kind of people who can find new jobs easily.” As society figures out how best to integrate the technologies that will decisively shape it, awareness, vigilance and forward-thinking initiatives like Balboni’s are sorely needed.
By Florian Raith (text), Rafaël Philippen (image)
Paolo Balboni is professor of Privacy, Cybersecurity and IT Contract Law at the European Centre on Privacy and Cybersecurity (ECPC) at the Faculty of Law. He is involved in several EU-funded research projects on cybersecurity and privacy, and regularly advises governments, EU institutions and multinationals on these topics. He is a founding partner of the international law firm ICT Legal Consulting and author of Trustmarks in e-commerce: The value of web seals and the liability of their providers.